Topic Models with Sparse and Group-Sparsity Inducing Priors
Author
Abstract
The quality of topic models depends strongly on the quality of the documents used. Insufficient information may result in topics that are difficult to interpret or evaluate. Incorporating external data can help to increase the quality of topic models. We propose sparsity- and group-sparsity-inducing priors on the hyperparameters of the word-topic probabilities in fully Bayesian Latent Dirichlet Allocation (LDA). This enables controlled integration of prior information about words.
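As a hedged illustration only (not the paper's actual model), a sparsity- plus group-sparsity-inducing log-prior over a topic-word hyperparameter matrix can be sketched as an element-wise L1 penalty (Laplace-like, zeroing individual entries) combined with a sum of per-group L2 norms (group-lasso-like, zeroing whole word groups at once). The names `eta`, `groups`, and the penalty weights are illustrative assumptions:

```python
import numpy as np

def sparse_group_log_prior(eta, groups, lam_l1=1.0, lam_group=1.0):
    """Unnormalized log-prior over a K x V hyperparameter matrix `eta`.

    lam_l1 * ||eta||_1 induces element-wise sparsity (Laplace-like prior);
    lam_group * sum_g ||eta[:, g]||_F induces sparsity over word groups
    (group-lasso-like prior), encouraging whole groups of words to vanish.
    """
    l1 = lam_l1 * np.abs(eta).sum()
    group = lam_group * sum(np.linalg.norm(eta[:, g]) for g in groups)
    return -(l1 + group)
```

For example, with two word groups `[[0, 1], [2, 3]]`, an all-zero `eta` attains the maximal log-prior of 0, and every nonzero entry is penalized both individually and through its group norm.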
Similar Resources
Bayesian Models for Structured Sparse Estimation via Set Cover Prior
A number of priors have recently been developed for Bayesian estimation of sparse models. In many applications, variables are relevant or irrelevant simultaneously in groups, and modeling this correlation appropriately is important for improved sample efficiency. Although group-sparse priors are also available, most of them are either limited to disjoint groups or do not infer sparsity at g...
Bayesian Sparsity for Intractable Distributions
Bayesian approaches for single-variable and group-structured sparsity outperform L1 regularization, but are challenging to apply to large, potentially intractable models. Here we show how noncentered parameterizations, a common trick for improving the efficiency of exact inference in hierarchical models, can similarly improve the accuracy of variational approximations. We develop this with two ...
Online Dictionary Learning with Group Structure Inducing Norms
• Sparse coding. • Structured sparsity (e.g., disjunct groups, trees): increased performance in several applications. • Our goal: develop a dictionary learning method, which – enables general overlapping group structures, – is online: fast, memory efficient, adaptive, – applies non-convex sparsity inducing regularization: ∗ fewer measurements, ∗ weaker conditions on the dictionary, ∗ robust (w....
Sparse Bayesian Multi-Task Learning
We propose a new sparse Bayesian model for multi-task regression and classification. The model is able to capture correlations between tasks, or more specifically a low-rank approximation of the covariance matrix, while being sparse in the features. We introduce a general family of group sparsity inducing priors based on matrix-variate Gaussian scale mixtures. We show the amount of sparsity can...
On Bayesian classification with Laplace priors
We present a new classification approach, using a variational Bayesian estimation of probit regression with Laplace priors. Laplace priors have previously been used extensively as a sparsity-inducing mechanism to perform feature selection simultaneously with classification or regression. However, contrary to the 'myth' of sparse Bayesian learning with Laplace priors, we find that the sparsity...
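The abstract's point is that a full Bayesian posterior under a Laplace prior is generally not sparse: exact zeros arise only from MAP point estimation. For an orthonormal design, the MAP estimate under a Laplace prior reduces to soft-thresholding, sketched below; the threshold `lam` stands in for the prior scale times the noise variance (an illustrative assumption):

```python
import numpy as np

def soft_threshold(w, lam):
    # MAP estimate under a Laplace prior with an orthonormal design:
    # shrink each coefficient toward zero, setting small ones exactly to zero.
    # The posterior mean, by contrast, is shrunk but never exactly zero.
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)
```

For example, `soft_threshold(np.array([3.0, -0.5, 1.0]), 1.0)` yields values close to `[2.0, 0.0, 0.0]`: the large coefficient survives (shrunk by `lam`) while the small ones are zeroed out, which is the sparsity mechanism the abstract contrasts with the non-sparse full posterior.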